H∞ Optimal Training Algorithms and their Relation to Backpropagation
Abstract
We derive global H∞ optimal training algorithms for neural networks. These algorithms guarantee the smallest possible prediction error energy over all possible disturbances of fixed energy, and are therefore robust with respect to model uncertainties and lack of statistical information on the exogenous signals. The ensuing estimators are infinite-dimensional, in the sense that updating the weight vector estimate requires knowledge of all previous weight estimates. A certain finite-dimensional approximation to these estimators is the backpropagation algorithm. This explains the local H∞ optimality of backpropagation that has been previously demonstrated.
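A concrete reading of the criterion, in the standard notation of H∞ adaptive filtering (the symbols d_i, h_i, v_i, μ, and γ below are conventional, not quoted from this abstract): for measurements d_i = h_i^T w + v_i with unknown weight vector w and disturbances v_i, an estimator achieving level γ must keep the worst-case ratio of prediction-error energy to disturbance energy below γ²:

\[
\sup_{w,\; v \neq 0} \;
\frac{\sum_{i=0}^{N} \bigl| h_i^{T} w - h_i^{T} \hat{w}_{i-1} \bigr|^{2}}
     {\mu^{-1} \lVert w - \hat{w}_{-1} \rVert^{2} + \sum_{i=0}^{N} \lvert v_i \rvert^{2}}
\;\leq\; \gamma^{2}
\]

The globally optimal algorithm attains the smallest achievable γ over all causal estimators. In the linear (single-layer) case the instantaneous-gradient update \(\hat{w}_i = \hat{w}_{i-1} + \mu h_i (d_i - h_i^{T}\hat{w}_{i-1})\), i.e. LMS, the linear specialization of backpropagation, is known to achieve the optimal level γ = 1 under a suitable bound on the step size μ, which is the local-optimality result the abstract refers to.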
Similar Resources
What Size Neural Network Gives Optimal Generalization? Convergence Properties of Backpropagation
One of the most important aspects of any machine learning paradigm is how it scales according to problem size and complexity. Using a task with known optimal training error, and a pre-specified maximum number of training updates, we investigate the convergence of the backpropagation algorithm with respect to a) the complexity of the required function approximation, b) the size of the network in...
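A sense of the kind of experiment described: the sketch below (an illustrative assumption, not the study's actual task or protocol) trains the same regression task at several hidden-layer sizes under a fixed budget of backpropagation updates and reports the final training error. NumPy only.

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(256, 2))
y = (np.sin(3.0 * X[:, 0]) * X[:, 1]).reshape(-1, 1)   # task with a known target

def train_mlp(hidden, updates=5000, lr=0.05):
    """Train a one-hidden-layer tanh network by batch backpropagation."""
    W1 = rng.normal(0.0, 0.5, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(updates):                 # fixed, pre-specified update budget
        H = np.tanh(X @ W1 + b1)             # forward pass
        err = (H @ W2 + b2) - y              # prediction error
        gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1.0 - H ** 2)   # backpropagate through tanh
        gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2))

for hidden in (2, 8, 32, 128):               # sweep network size
    print(f"hidden={hidden:4d}  final training MSE={train_mlp(hidden):.4f}")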
Discovery of Optimal Backpropagation Learning Rules Using Genetic Programming
The development of the backpropagation learning rule has been a landmark in neural networks. It provides a computational method for training multilayer networks. Unfortunately, backpropagation suffers from several problems. In this paper, a new technique based upon Genetic Programming (GP) is proposed to overcome some of these problems. We have used GP to discover new supervised learning algori...
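A stripped-down sketch of the idea, under illustrative assumptions (a toy linear-regression task, a three-terminal expression grammar, and mutation-only evolution standing in for full GP with crossover): candidate update rules are expression trees over the weights w, gradient g, and learning rate lr, and a rule's fitness is the loss it reaches when used as the weight update.

import random
import numpy as np

random.seed(1)
rng = np.random.default_rng(1)
X = rng.normal(size=(64, 3))
y = X @ np.array([1.0, -2.0, 0.5])           # toy regression task

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b, "*": lambda a, b: a * b}

def random_rule(depth=2):
    """Random expression tree over terminals w, g, lr and binary ops."""
    if depth == 0:
        return random.choice(["w", "g", "lr"])
    return (random.choice(list(OPS)), random_rule(depth - 1), random_rule(depth - 1))

def evaluate(rule, w, g, lr):
    if isinstance(rule, str):
        return {"w": w, "g": g, "lr": np.full_like(w, lr)}[rule]
    op, left, right = rule
    return OPS[op](evaluate(left, w, g, lr), evaluate(right, w, g, lr))

def fitness(rule, steps=100, lr=0.1):
    """Loss reached when the candidate rule serves as the weight update."""
    w = np.zeros(3)
    for _ in range(steps):
        g = 2.0 * X.T @ (X @ w - y) / len(X)  # gradient of the MSE
        w = w - evaluate(rule, w, g, lr)      # apply the evolved update
        if not np.all(np.isfinite(w)):
            return np.inf                     # diverged: worst fitness
    return float(np.mean((X @ w - y) ** 2))

def mutate(rule, p=0.3):
    if random.random() < p or isinstance(rule, str):
        return random_rule()
    op, left, right = rule
    return (op, mutate(left, p), mutate(right, p))

population = [random_rule() for _ in range(50)]
for _ in range(10):                           # generations: select, then mutate
    population.sort(key=fitness)
    population = population[:10] + [mutate(r) for r in population[:10] for _ in range(4)]
best = min(population, key=fitness)
print("best rule:", best, " fitness:", fitness(best))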
Supervised Training Using Global Search Methods
Supervised learning in neural networks based on the popular backpropagation method can often be trapped in a local minimum of the error function. The class of backpropagation-type training algorithms includes local minimization methods that have no mechanism that allows them to escape the influence of a local minimum. The existence of local minima is due to the fact that the error function is t...
Global Search Methods for Neural Network Training
In many cases the supervised neural network training using a backpropagation based learning rule can be trapped in a local minimum of the error function. These training algorithms are local minimization methods and have no mechanism that allows them to escape the influence of a local minimum. The existence of local minima is due to the fact that the error function is the superposition of nonline...
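Both items above concern escaping local minima via global search. A minimal sketch of the simplest such wrapper, multistart, on an illustrative one-dimensional double-well objective standing in for a network error surface (the function and parameters are assumptions, not from either paper):

import numpy as np

rng = np.random.default_rng(2)

def loss(w):
    # double well: local minimum near w ≈ -0.8, global minimum at w = 2
    return (w + 1.0) ** 2 * (w - 2.0) ** 2 + 0.5 * (w - 2.0) ** 2

def grad(w, eps=1e-6):
    # central-difference gradient; an analytic gradient would do equally well
    return (loss(w + eps) - loss(w - eps)) / (2.0 * eps)

def descend(w, lr=0.01, steps=2000):
    # plain gradient descent: converges to whichever basin w starts in
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# Multistart: run the local method from several random initializations
# and keep the lowest minimum found, so one bad basin is not fatal.
starts = rng.uniform(-3.0, 3.0, size=8)
best = min((descend(w0) for w0 in starts), key=loss)
print(f"best w = {best:.3f}, loss = {loss(best):.3f}")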
An Analysis of Artificial Neural Network Pruning Algorithms in Relation to Land Cover Classification Accuracy
Artificial neural networks (ANNs) have been widely used for many classification purposes and generally proved to be more powerful than conventional statistical techniques. However, the use of ANNs requires decisions on the part of the user which may affect the accuracy of the resulting classification. One of these decisions concerns the determination of the optimum network structure for a parti...
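One family of pruning algorithms such analyses typically compare is magnitude pruning: remove the smallest-magnitude weights, then fine-tune with the pruned connections frozen. A minimal sketch under illustrative assumptions (a random matrix standing in for a trained layer, a 50% pruning fraction):

import numpy as np

rng = np.random.default_rng(3)
W = rng.normal(size=(16, 8))              # stands in for a trained layer's weights

def magnitude_prune(W, fraction=0.5):
    # zero out the smallest `fraction` of weights by absolute value
    k = int(W.size * fraction)
    threshold = np.sort(np.abs(W), axis=None)[k]
    mask = np.abs(W) >= threshold
    return W * mask, mask

W_pruned, mask = magnitude_prune(W, 0.5)
print(f"weights kept: {mask.mean():.2f}")  # ≈ 0.50
# During fine-tuning the mask freezes pruned connections:
# applying lr * grad * mask keeps pruned weights at zero.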
Publication date: 1994